Pipelined Neural Tree Learning by Error Forward-Propagation

Author

  • Alois P. Heinz
Abstract

We propose a new parallel implementation of the neural tree feed-forward network architecture that supports efficient evaluation and learning regardless of the number of layers. The neurons of each layer operate in parallel and the layers are the elements of a pipeline that computes the output evaluation vectors for a sequence of input pattern vectors at a rate of one per time step. During the learning phase the desired outputs are presented as additional inputs and the pipeline computes in feed-forward manner the gradients of the errors with respect to the neuron evaluations. Thus it is possible to run different gradient descent learning algorithms on the pipeline with a performance comparable to the evaluation algorithm.
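
The layer-pipelining idea described in the abstract can be illustrated with a minimal sketch. The Python code below is an illustration under stated assumptions only, not the paper's implementation: it uses a plain fully connected feed-forward network instead of the neural tree, the class name PipelinedMLP, the tanh activations, and the stage/latch bookkeeping are all hypothetical, and the learning-phase error forward-propagation is omitted because its details depend on the paper's specific construction. Each layer acts as a pipeline stage that works on the pattern its predecessor handled in the previous time step, so a new input can enter at every step and one output leaves per step once the pipeline is full:

import numpy as np

class PipelinedMLP:
    def __init__(self, layer_sizes, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per pipeline stage (layer).
        self.weights = [rng.normal(0.0, 0.1, (m, n))
                        for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
        # Latches hold the activation travelling between consecutive stages.
        self.latches = [None] * len(self.weights)

    def step(self, x):
        # Returns the output for the pattern fed len(self.weights) steps ago,
        # or None while the pipeline is still filling.
        out = self.latches[-1]
        # Update stages back to front so each reads the previous step's latch.
        for i in range(len(self.weights) - 1, 0, -1):
            prev = self.latches[i - 1]
            self.latches[i] = None if prev is None else np.tanh(self.weights[i] @ prev)
        self.latches[0] = np.tanh(self.weights[0] @ x)
        return out

net = PipelinedMLP([4, 8, 8, 2])
rng = np.random.default_rng(1)
for t in range(8):
    y = net.step(rng.normal(size=4))   # a new pattern enters at every time step
    print(t, None if y is None else np.round(y, 3))

Once the pipeline is filled, throughput is one evaluation per time step regardless of depth, which is the property the abstract claims to carry over from evaluation to learning.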

Similar articles

Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks

Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs. Accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardwa...

Global Solar Radiation Prediction for Makurdi, Nigeria Using Feed Forward Backward Propagation Neural Network

The optimum design of solar energy systems strongly depends on the accuracy of solar radiation data. However, the availability of accurate solar radiation data is undermined by the high cost of measuring equipment or non-functional ones. This study developed a feed-forward backpropagation artificial neural network model for prediction of global solar radiation in Makurdi, Nigeria (7.7322 N lo...

A novel fast learning algorithms for time-delay neural networks

To counter the drawback that Waibel's time-delay neural networks (TDNN) take up long training time in phoneme recognition, the paper puts forward several improved fast learning methods for TDNN. Merging unsupervised Oja's rule and the similar error back propagation algorithm for initial training of TDNN weights can effectively increase convergence speed, at the same time the error function almost m...

Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns

The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and the second momentum term in feed-forward neural networks. This analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...

Comparison of Artificial Neural Network and Multiple Regression Analysis for Prediction of Fat Tail Weight of Sheep

A comparative study of artificial neural network (ANN) and multiple regression is made to predict the fat tail weight of Balouchi sheep from birth, weaning and finishing weights. A multilayer feed-forward network with a backpropagation-of-error learning mechanism was used to predict the sheep body weight. The data (69 records) were randomly divided into two subsets. The first subset is the train...

Journal:

Volume   Issue 

Pages  -

Publication year: 1995